
[tests] refactor autoencoderkl tests#13368

Open
sayakpaul wants to merge 9 commits into main from autoencoderkl-tests-refactor

Conversation

@sayakpaul
Member

No description provided.

@sayakpaul sayakpaul requested a review from DN6 March 30, 2026 09:49
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

github-actions bot added the tests and size/M (PR with diff < 200 LOC) labels on Apr 12, 2026
@sayakpaul
Member Author

@DN6 a gentle ping.

Comment on lines 344 to +362
@@ -362,10 +359,7 @@ def test_stable_diffusion_decode_xformers_vs_2_0_fp16(self, seed):

 @parameterized.expand([(13,), (16,), (37,)])
 @require_torch_gpu
-@unittest.skipIf(
-    not is_xformers_available(),
-    reason="xformers is not required when using PyTorch 2.0.",
-)
+@pytest.mark.skipif(not is_xformers_available(), reason="xformers is not required when using PyTorch 2.0.")
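The hunk above is the core of the refactor: a multi-line `unittest.skipIf` decorator collapses into a one-line `pytest.mark.skipif`. A minimal sketch of the equivalence, using a stand-in `is_xformers_available` (hypothetical here, always returning False) in place of the real diffusers helper:

```python
import unittest

import pytest


def is_xformers_available():
    # Hypothetical stand-in for diffusers' real availability check.
    return False


# Before: unittest-style skip. The skip condition is evaluated at
# decoration time and marks the test object for unittest's runner.
@unittest.skipIf(not is_xformers_available(), "xformers is not installed")
def test_decode_unittest_style():
    pass


# After: pytest-style skip. The mark works on plain test functions,
# parametrized tests, and unittest.TestCase methods alike.
@pytest.mark.skipif(not is_xformers_available(), reason="xformers is not installed")
def test_decode_pytest_style():
    pass
```

Both decorators evaluate the condition once at import time; the practical difference is that the pytest mark composes cleanly with `@parameterized.expand` and other marks in the same stack.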
Collaborator

Why don't we remove these xformers tests? They're not as relevant and we rely on set_attention_backend for this functionality.

Member Author

Do we support set_attention_backend() on AutoencoderKL? I don't think we do.

I don't think this was a merge blocker 😅


Labels

size/M PR with diff < 200 LOC tests


3 participants